Module 3 Activity Research

Weekly Activity Template

Yawen Qiao


Project 3


Module 3

We have now reached the final project of the semester, and I feel our work has made significant strides throughout the term. In the first project we established our main research direction, and in the second I focused primarily on projection design, aiming for a more three-dimensional visual effect. This final project adds interactive features, allowing visitors to engage with our installation and alter the projected content; it also represents our attempt to integrate design with fine art. Along the way we not only designed the projection content and the interactive installation, but also built websites to test whether we could achieve the presentation effects we envisioned. Personally, this project gave me valuable insight into the design philosophy and mindset of installation art, as well as a precious opportunity to experience human-machine interaction firsthand.

Workshop 1

This is the human-computer interaction concept we explored in our first workshop. In class, our group drew the theme "Smart Kitchen," so we divided the tasks and designed AI-related features for various kitchen appliances. These features let the appliances give feedback or assist users during operation. This may represent a future trend in AI development: empowering users to operate complex kitchen equipment more easily. <a href='https://youtube.com/shorts/6HBpUUo2JLM?feature=share' target='_blank'><p>Project Video Link</p></a>

Activity 1: My Research

This is the paper model with the two sensors connected, shown first with the door closed and then with the door open. This shows the two sensors connected to the Arduino: one is an ultrasonic sensor, and the other is a distance-detecting sensor. This is a schematic diagram of the working principle. This is a test projection, although we did not use this image for the final display.
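As a rough illustration of how the ultrasonic sensor mentioned above produces a distance reading: an HC-SR04-style sensor reports the duration of an echo pulse, which is converted to centimeters using the speed of sound. This is a minimal sketch under that assumption; the exact sensor model and the 50 cm detection threshold are illustrative guesses, not values from the project.

```cpp
// Converts an ultrasonic echo pulse (microseconds) into a distance.
// Sound travels roughly 0.0343 cm per microsecond; the pulse covers
// the round trip to the object and back, so the result is halved.
float pulseToCentimeters(unsigned long echoMicros) {
    return echoMicros * 0.0343f / 2.0f;
}

// Treats any reading closer than the threshold as a person standing
// in front of the sensor (the 50 cm threshold is an assumed value).
bool personDetected(unsigned long echoMicros, float thresholdCm = 50.0f) {
    return pulseToCentimeters(echoMicros) < thresholdCm;
}
```

For example, a 1000 µs echo corresponds to about 17 cm, well inside a 50 cm threshold, while a 5000 µs echo (about 86 cm) would not trigger detection.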

Activity 2: My Research

This is a paper model I created for various scenarios. This is a Christmas tree website I generated with Gemini, which can transform the tree into particle shapes. This is a meteor shower website I generated with Gemini, which can produce numerous meteor trails. This is the test effect and code demonstration. This page shows the Arduino connected to TouchDesigner.

Additional Research or Workshops

This is the particle effect displayed in the actual projection. This is how the final Christmas tree appears when projected onto the paper model, creating a three-dimensional visual effect. This is a demonstration image from our initial attempts at projecting onto the house model. These show the code and the website-generation process.



Project 3 Final Prototype

This is the final product of our third project. It detects people entering or exiting through two different sensors. The triggering principle works like this: if the sensors fire in the order 1-2, it counts as an entry; if they fire in the order 2-1, it counts as an exit. As the number of people inside increases, the projected content changes accordingly. We chose a Christmas tree as the projection. Initially, the tree appears as scattered particles; as more people enter, the particle tree gradually coalesces. Once a certain threshold is reached, the tree projects precisely onto our prepared 3D paper model, achieving a fully dimensional effect. For this project, we primarily used Arduino, TouchDesigner, and Gemini to support our work.
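The 1-2 / 2-1 triggering sequence described above can be sketched as a small state machine that tracks the occupancy count. This is a minimal C++ sketch, assuming each sensor simply reports its ID when triggered; the class and method names are illustrative, not taken from the project code.

```cpp
// Tracks room occupancy from two sensors placed at a doorway.
// Sensor 1 is on the outside, sensor 2 on the inside: a firing
// order of 1 then 2 counts as an entry, 2 then 1 as an exit.
class PeopleCounter {
public:
    void trigger(int sensorId) {
        if (lastSensor_ == 0) {
            lastSensor_ = sensorId;        // first half of a crossing
        } else if (lastSensor_ == 1 && sensorId == 2) {
            ++count_;                      // 1 -> 2: someone entered
            lastSensor_ = 0;
        } else if (lastSensor_ == 2 && sensorId == 1) {
            if (count_ > 0) --count_;      // 2 -> 1: someone left
            lastSensor_ = 0;
        } else {
            lastSensor_ = sensorId;        // same sensor twice: restart
        }
    }
    int count() const { return count_; }

private:
    int lastSensor_ = 0;  // 0 = waiting for the first trigger
    int count_ = 0;
};
```

The count never drops below zero, so a stray exit event before anyone has entered leaves the occupancy at zero rather than corrupting the state.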

The setup shown here features TouchDesigner on the left. The paper on the table serves as the trigger device for the projection, and in front of the camera the paper model and the projected content are visible.
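The gradual coalescence described for the final prototype, from scattered particles to the fully formed tree, can be thought of as a blend factor driven by the visitor count. This is a minimal sketch of that mapping; the threshold of five people at which the tree fully forms is an assumed value, not one documented in the project.

```cpp
// Maps the current visitor count to a 0..1 blend between fully
// scattered particles (0.0) and the fully formed tree (1.0).
// The threshold count is an illustrative assumption.
float coalescence(int peopleCount, int threshold = 5) {
    if (peopleCount <= 0) return 0.0f;
    if (peopleCount >= threshold) return 1.0f;
    return static_cast<float>(peopleCount) / threshold;
}
```

A value like this could be sent from the Arduino over serial and used in TouchDesigner to interpolate particle positions between their scattered locations and the tree shape.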
